On Learning Matrices with Orthogonal Columns or Disjoint Supports
Authors
Abstract
We investigate new matrix penalties to jointly learn linear models with orthogonality constraints, generalizing the work of Xiao et al. [24] who proposed a strictly convex matrix norm for orthogonal transfer. We show that this norm converges to a particular atomic norm when its convexity parameter decreases, leading to new algorithmic solutions to minimize it. We also investigate concave formulations of this norm, corresponding to more aggressive strategies to induce orthogonality, and show how these penalties can also be used to learn sparse models with disjoint supports.
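The norms studied in the paper are not reproduced here; as a rough illustration of how a matrix penalty can promote orthogonal columns, the sketch below uses a simple generic surrogate, the sum of absolute inner products between distinct columns, which vanishes exactly when the columns are mutually orthogonal. This is an assumption-laden stand-in, not the strictly convex norm of Xiao et al. or the atomic norm discussed in the abstract, and the function name is hypothetical.

```python
import numpy as np

def pairwise_orthogonality_penalty(W):
    """Sum of |<w_i, w_j>| over distinct column pairs of W.

    Generic illustration only (not the paper's norm): the penalty is
    zero iff the columns of W are mutually orthogonal, so adding it to
    a loss pushes the learned linear models toward orthogonality.
    """
    G = W.T @ W                        # Gram matrix of the columns
    off_diag = G - np.diag(np.diag(G)) # zero out the diagonal
    return np.abs(off_diag).sum()

# Orthogonal columns incur no penalty; correlated columns do.
Q = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])
print(pairwise_orthogonality_penalty(Q))            # 0.0
print(pairwise_orthogonality_penalty(np.ones((3, 2))) > 0)
```

Note that the same quantity also vanishes for columns with disjoint supports, which is one way to see the connection the abstract draws between orthogonality and disjoint-support sparsity.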
Similar articles
Stability and vibration analyses of tapered columns resting on one or two-parameter elastic foundations
This paper presents a generalized numerical method to evaluate the element stiffness matrices needed for the free vibration and stability analyses of non-prismatic columns resting on one- or two-parameter elastic foundations and subjected to variable axial load. For this purpose, a power series approximation is used to solve the fourth-order differential equation of non-prismatic columns with v...
A Life's Work on Hadamard Matrices
1 Hadamard matrices in Space Communications. One hundred years ago, in 1893, Jacques Hadamard [21] found square matrices of orders 12 and 20, with entries ±1, which had all their rows (and columns) orthogonal. These matrices, $X = (x_{ij})$, satisfied equality in the inequality $|\det X|^2 \le \prod_{i=1}^{n} \sum_{j=1}^{n} |x_{ij}|^2$ and had maximal determinant. Hadamard actually asked the question of matrices ...
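Hadamard's determinant inequality quoted in this snippet can be checked numerically. The sketch below builds an order-4 Hadamard matrix by Sylvester's construction (a standard textbook construction, not taken from the reviewed paper) and verifies that ±1 matrices with orthogonal rows attain equality:

```python
import numpy as np

# Sylvester's construction: Kronecker powers of the 2x2 seed give
# ±1 matrices whose rows (and columns) are mutually orthogonal.
H2 = np.array([[1, 1], [1, -1]])
H4 = np.kron(H2, H2)                    # order-4 Hadamard matrix

lhs = np.linalg.det(H4) ** 2            # |det X|^2
rhs = np.prod((H4 ** 2).sum(axis=1))    # prod_i sum_j |x_ij|^2

# Equality holds (up to floating point): Hadamard matrices have
# maximal determinant among matrices with entries bounded by 1.
print(lhs, rhs)
```

For a general real matrix the left side is at most the right side; orthogonality of the rows is exactly the equality case.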
On the optimal correction of inconsistent matrix equations $AX = B$ and $XC = D$ with orthogonal constraint
This work focuses on the correction of both the coefficient and the right-hand side matrices of the inconsistent matrix equations $AX = B$ and $XC = D$ with orthogonal constraint. By an optimal correction approach, a general representation of the orthogonal solution is obtained. The method is tested on two examples to show that the optimal correction is effective and highly accurate.
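The abstract above corrects the data matrices themselves; for context, the classical baseline for the orthogonally constrained equation $AX = B$ with the data held fixed is the orthogonal Procrustes solution via the SVD. The sketch below shows that standard construction (ordinary linear algebra, not the correction method of the cited paper; the test data are illustrative):

```python
import numpy as np

def orthogonal_procrustes(A, B):
    """Orthogonal X minimizing ||A X - B||_F.

    Classical result: with A^T B = U S V^T (SVD), the minimizer over
    orthogonal matrices is X = U V^T. This is a baseline, not the
    optimal-correction approach of the cited paper.
    """
    U, _, Vt = np.linalg.svd(A.T @ B)
    return U @ Vt

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))
X_true, _ = np.linalg.qr(rng.standard_normal((3, 3)))
B = A @ X_true                       # consistent system: exact recovery
X = orthogonal_procrustes(A, B)
print(np.allclose(X, X_true))        # True
```

When the system is inconsistent, this still returns the best orthogonal fit in the Frobenius norm, which is the situation the correction approach of the paper then improves on by also perturbing $A$ and $B$.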
Nearly positive matrices
Nearly positive matrices are nonnegative matrices which, when premultiplied by orthogonal matrices as close to the identity as one wishes, become positive. In other words, all columns of a nearly positive matrix are mapped simultaneously to the interior of the nonnegative cone by multiplication by a sequence of orthogonal matrices converging to the identity. In this paper, nearly positive matric...
From random matrices to quasi-periodic Jacobi matrices via orthogonal polynomials
We present an informal review of results on asymptotics of orthogonal polynomials, stressing their spectral aspects and the similarity between the two cases considered. They are polynomials orthonormal on a finite union of disjoint intervals with respect to the Szegö weight, and polynomials orthonormal on R with respect to varying weights and having the same union of intervals as the set of oscillations of a...
Publication date: 2014